    Integration of highly probabilistic sources into optical quantum architectures: perpetual quantum computation

    In this paper we introduce a design for an optical topological cluster state computer constructed exclusively from a single quantum component. Unlike previous efforts we eliminate the need for on-demand, high-fidelity photon sources and detectors and replace them with the same device utilised to create photon/photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining complete specificity of the structure and operation for a large scale computer. Photons in this system are continually recycled back into the preparation network, allowing an arbitrarily deep 3D cluster to be prepared using a comparatively small number of photonic qubits and consequently eliminating the need for high-frequency, deterministic photon sources. Comment: 19 pages, 13 Figs (2 Appendices with additional Figs.). Comments welcome
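    A minimal sketch of the resource argument behind recycling, under assumptions not taken from the paper: if each entangling attempt succeeds with some probability p and failed photons can be returned to the preparation network with efficiency eta, recycling sharply reduces the number of fresh photons consumed per cluster bond. The values of p and eta below are illustrative placeholders.

```python
# Illustrative sketch (assumed parameters, not the paper's model):
# compare fresh-photon cost per successful bond with and without recycling.
import random

def attempts_until_success(p: float, rng: random.Random) -> int:
    """Number of tries of a gate that succeeds with probability p."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

def photons_consumed(p: float, eta: float, trials: int = 100_000, seed: int = 1):
    """Average fresh photons per successful bond.

    Without recycling, every failed attempt costs a fresh photon pair.
    With recycling, a failed attempt returns its photons with probability
    eta, so only a fraction (1 - eta) of failures consume new photons.
    """
    rng = random.Random(seed)
    no_recycle, recycle = 0.0, 0.0
    for _ in range(trials):
        n = attempts_until_success(p, rng)
        no_recycle += 2 * n                       # 2 photons per attempt
        recycle += 2 * (1 + (n - 1) * (1 - eta))  # most failures reused
    return no_recycle / trials, recycle / trials

if __name__ == "__main__":
    baseline, recycled = photons_consumed(p=0.25, eta=0.9)
    print(f"no recycling: {baseline:.1f} photons/bond, "
          f"with recycling: {recycled:.1f} photons/bond")
```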

    From quantum fusiliers to high-performance networks

    Our objective was to design a quantum repeater capable of achieving one million entangled pairs per second over a distance of 1000 km. We failed, but not by much. In this letter we describe the series of developments that permitted us to approach our goal. We describe a mechanism that permits the creation of entanglement between two qubits, connected by fibre, with probability arbitrarily close to one and in constant time. This mechanism may be extended to ensure that the entanglement has high fidelity without compromising these properties. Finally, we describe how this may be used to construct a quantum repeater capable of creating a linear quantum network connecting two distant qubits with high fidelity. The creation rate is shown to be a function of the maximum distance between two adjacent quantum repeaters. Comment: 2 figures, Comments welcome
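    One standard way to reach near-unit success in constant time is to multiplex many heralded attempts in parallel, so that at least one succeeds per time slot. The abstract does not spell out its mechanism, so the per-attempt success probability, slot time, and target failure rate below are assumptions used only to illustrate the scaling 1 - (1 - p)^n.

```python
# Illustrative back-of-the-envelope (assumed numbers, not the paper's):
# n parallel attempts per slot succeed with probability 1 - (1 - p)**n,
# which approaches 1 at constant time cost as n grows.
import math

def attempts_needed(p: float, delta: float) -> int:
    """Smallest n with 1 - (1 - p)**n >= 1 - delta."""
    return math.ceil(math.log(delta) / math.log(1.0 - p))

if __name__ == "__main__":
    p = 0.05        # assumed per-attempt success over one repeater spacing
    delta = 1e-6    # tolerated failure probability per time slot
    n = attempts_needed(p, delta)
    print(f"{n} parallel attempts give success probability >= {1 - delta}")
    # With an assumed 10 microsecond slot, a chain that succeeds every slot
    # supplies 1 / 10e-6 = 1e5 pairs per second end to end.
    print(f"rate per chain ~ {1 / 10e-6:.0f} pairs/s")
```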

    Simulating chemistry efficiently on fault-tolerant quantum computers

    Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. Here we consider methods to make proposed chemical simulation algorithms computationally fast on fault-tolerant quantum computers in the circuit model. Fault tolerance constrains the choice of available gates, so that arbitrary gates required for a simulation algorithm must be constructed from sequences of fundamental operations. We examine techniques for constructing arbitrary gates which perform substantially faster than circuits based on the conventional Solovay-Kitaev algorithm [C. M. Dawson and M. A. Nielsen, Quantum Inf. Comput. 6:81, 2006]. For a given approximation error $\epsilon$, arbitrary single-qubit gates can be produced fault-tolerantly and using a limited set of gates in time which is $O(\log \epsilon)$ or $O(\log \log \epsilon)$; with sufficient parallel preparation of ancillas, constant average depth is possible using a method we call programmable ancilla rotations. Moreover, we construct and analyze efficient implementations of first- and second-quantized simulation algorithms using the fault-tolerant arbitrary gates and other techniques, such as implementing various subroutines in constant time. A specific example we analyze is the ground-state energy calculation for lithium hydride. Comment: 33 pages, 18 figures
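    To make the speed-up concrete, a rough comparison of gate-count scalings: Solovay-Kitaev style synthesis costs roughly log^c(1/epsilon) fundamental gates per arbitrary rotation (c is about 3.97 in the Dawson-Nielsen analysis), whereas the faster constructions discussed above scale near-linearly in log(1/epsilon). The prefactors in this sketch are assumed placeholders, not values from the paper.

```python
# Illustrative comparison of synthesis-cost scalings (prefactors assumed).
import math

def sk_length(eps: float, c: float = 3.97, k: float = 1.0) -> float:
    """Heuristic Solovay-Kitaev sequence length ~ k * log^c(1/eps)."""
    return k * math.log(1.0 / eps) ** c

def linear_length(eps: float, k: float = 4.0) -> float:
    """Heuristic length for an O(log(1/eps))-style synthesis method."""
    return k * math.log(1.0 / eps)

if __name__ == "__main__":
    for eps in (1e-3, 1e-6, 1e-9, 1e-12):
        print(f"eps={eps:.0e}  Solovay-Kitaev ~ {sk_length(eps):9.0f} gates   "
              f"log-scaling ~ {linear_length(eps):4.0f} gates")
```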

    Architectural design for a topological cluster state quantum computer

    The development of a large scale quantum computer is a highly sought-after goal of fundamental research and consequently a highly non-trivial problem. Scalability in quantum information processing is not just a problem of qubit manufacturing and control but crucially depends on the ability to adapt advanced techniques in quantum information theory, such as error correction, to the experimental restrictions of assembling qubit arrays into the millions. In this paper we introduce a feasible architectural design for large scale quantum computation in optical systems. We combine recent developments in topological cluster state computation with the photonic module, a simple chip-based device which can be used as a fundamental building block for a large scale computer. The integration of the topological cluster model with this comparatively simple operational element addresses many significant issues in scalable computing and leads to a promising modular architecture with complete integration of active error correction, exhibiting high fault-tolerant thresholds. Comment: 14 pages, 8 figures, changes to the main text, new appendix added
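    A rough sense of what a high fault-tolerant threshold buys: topological codes such as the topological cluster state are commonly modelled as suppressing logical errors like p_L ~ A (p/p_th)^((d+1)/2) below threshold, so modest code distances suffice once the physical error rate sits well under p_th. The prefactor, threshold, and target values below are assumed for illustration and are not taken from the paper.

```python
# Rough heuristic (assumed constants, not the paper's numbers):
# logical error suppression in a topological code below threshold.
def logical_error_rate(p: float, d: int, p_th: float = 0.01, A: float = 0.1) -> float:
    """Heuristic logical error rate p_L ~ A * (p / p_th)**((d + 1) / 2)."""
    return A * (p / p_th) ** ((d + 1) / 2)

def distance_for_target(p: float, target: float, p_th: float = 0.01, A: float = 0.1) -> int:
    """Smallest odd code distance d with logical_error_rate <= target."""
    d = 3
    while logical_error_rate(p, d, p_th, A) > target:
        d += 2
    return d

if __name__ == "__main__":
    p = 1e-3          # assumed physical error rate, below threshold
    target = 1e-12    # assumed target logical error rate per cycle
    d = distance_for_target(p, target)
    print(f"distance {d} gives p_L ~ {logical_error_rate(p, d):.1e}")
```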